- Facebook has confirmed that it scans conversations on its Messenger app to ensure that images and links meet its “community standards,” Bloomberg reports.
- The topic first drew scrutiny after a Vox interview in which CEO Mark Zuckerberg recounted a call alerting him that people “were trying to spread sensational messages” about Myanmar through Facebook Messenger.
Facebook confirmed on Wednesday that it scans users’ conversations on its Messenger app to ensure that the content meets its “community standards,” according to a report by Bloomberg’s Sarah Frier.
The confirmation from Facebook comes a week after the topic first drew scrutiny from privacy activists following CEO Mark Zuckerberg’s interview with Vox’s Ezra Klein, in which he described a situation where the company intervened to prevent abuse on one of its platforms, Facebook Messenger.
“I remember, one Saturday morning I got a phone call and we detected that people were trying to spread sensational messages through [Facebook Messenger] to each side of the conflict,” Zuckerberg said in response to a question from Klein about Facebook’s alleged role in spreading propaganda during the Rohingya crisis, the “conflict” he was referring to.
Since August 2017, Rohingya Muslims have been forced to flee Buddhist-dominated Myanmar because of ethnic cleansing by Myanmar security forces, in what the UN has described as bearing all the “hallmarks of genocide.” UN human rights experts have cited hate speech on Facebook as a vehicle for spreading anti-Rohingya propaganda, deepening the divide between the two religious groups.
Zuckerberg said that the messages detected by Facebook were "basically telling the Muslims, 'Hey, there's about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.' And then the same thing on the other side."
Facebook's effort to "stop those messages from going through" shows it is addressing one problem - abuse of its platforms - but it also raises questions about user privacy and how exactly Facebook monitors its Messenger app.
The company has been under intense pressure to be more careful about how it handles people's personal data ever since reports from The New York Times and The Guardian revealed that the data analytics firm Cambridge Analytica had improperly obtained data from 50 million Facebook users and used it to influence voters during the 2016 US presidential election.
On Wednesday, Facebook said it now believed that up to 87 million users may have had their data harvested by Cambridge Analytica.
Many felt that Facebook hadn't done enough to handle users' data responsibly - something Congress will press Zuckerberg on when he testifies about the company's handling of personal data and privacy on April 11.
A Facebook Messenger spokeswoman told Bloomberg that the information gathered from scanning Messenger - which was built into Facebook itself before being spun out as a separate app in 2014 - isn't used for advertising. Facebook said it uses the same automated tools to scan Messenger conversations as it does to monitor the public portions of the social network, and that a team steps in when the systems flag something out of the ordinary or a clear violation.
"Keeping your messages private is the priority for us," a Facebook Messenger spokeswoman said in a statement to Business Insider. "We protect the community with automated systems that detect things like known images of child exploitation and malware. This is not done by humans." Messenger users do have the option to enable encrypted messaging, but that security feature isn't turned on by default.